87 research outputs found

    A Practical, Accurate, Information Criterion for Nth Order Markov Processes

    Get PDF
    The recent increase in the breadth of computational methodologies has been matched by a corresponding increase in the difficulty of comparing the relative explanatory power of models from different methodological lineages. To help address this problem, a Markovian information criterion (MIC) is developed that is analogous to the Akaike information criterion (AIC) in its theoretical derivation, yet can be applied to any model able to generate simulated or predicted data, regardless of its methodology. Both the AIC and the proposed MIC rely on the Kullback–Leibler (KL) distance between model predictions and real data as a measure of prediction accuracy. Instead of using the maximum-likelihood approach of the AIC, the proposed MIC relies on the literal interpretation of the KL distance as the inefficiency of compressing real data using modelled probabilities, and therefore uses the output of a universal compression algorithm to obtain an estimate of the KL distance. Several Monte Carlo tests are carried out in order to (a) confirm the performance of the algorithm and (b) evaluate the ability of the MIC to identify the true data-generating process from a set of alternative models.
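    The coding interpretation behind the MIC can be illustrated with a minimal sketch: KL distance read as the per-symbol inefficiency (in bits) of coding data with the model's probabilities instead of the empirical ones. This is an order-0 simplification with hypothetical names (`kl_estimate`, `model_probs`); the paper itself works with Nth-order Markov processes and a universal compression algorithm rather than direct symbol counts.

    ```python
    import math
    from collections import Counter

    def kl_estimate(sequence, model_probs):
        """Estimate KL distance as coding inefficiency (bits per symbol).

        Cross-entropy under the model's probabilities is the average code
        length a coder would pay using the model; the empirical entropy is
        the ideal code length. Their difference is the inefficiency, i.e.
        the KL distance from the empirical distribution to the model.
        (Order-0 illustration only; not the paper's actual estimator.)
        """
        n = len(sequence)
        counts = Counter(sequence)
        # average code length when coding with the model's probabilities
        cross_entropy = -sum(c * math.log2(model_probs[s])
                             for s, c in counts.items()) / n
        # ideal average code length under the empirical distribution
        entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
        return cross_entropy - entropy

    # a model matching the data exactly pays no coding penalty
    print(kl_estimate("aabb", {"a": 0.5, "b": 0.5}))  # → 0.0
    # a mismatched model pays a strictly positive penalty
    print(kl_estimate("aaab", {"a": 0.5, "b": 0.5}) > 0)  # → True
    ```

    In the paper's setting, replacing the explicit symbol counts with the output length of a universal compressor lets the same inefficiency be estimated without enumerating the model's state space.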

    Searching methods for biometric identification systems: Fundamental limits

    Get PDF
    We study two-stage search procedures for biometric identification systems in an information-theoretical setting. Our main conclusion is that clustering based on vector quantization achieves the optimum trade-off between the number of clusters (cluster rate) and the number of individuals within a cluster (refinement rate). The notion of excess rate is introduced, a parameter related to the number of clusters to which the individuals belong. We demonstrate that noisier observation channels lead to larger excess rates. © 2009 IEEE
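    The two-stage structure described above can be sketched in toy form: a coarse cluster lookup against a small codebook, followed by exhaustive refinement within the selected cluster. All names and parameters here (`identify`, random Gaussian templates, centroids sampled from enrolled templates) are illustrative assumptions, not the paper's construction, which analyzes the fundamental rate trade-off rather than a concrete algorithm.

    ```python
    import random

    def dist(a, b):
        # squared Euclidean distance between two feature vectors
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def nearest(point, candidates):
        # index of the closest candidate vector
        return min(range(len(candidates)), key=lambda i: dist(point, candidates[i]))

    random.seed(0)
    DIM, N_PEOPLE, N_CLUSTERS = 4, 50, 5

    # enrollment: one biometric template per individual (toy Gaussian features)
    templates = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N_PEOPLE)]

    # stage-1 structure: centroids sampled from the templates stand in for
    # a trained vector-quantization codebook
    centroids = random.sample(templates, N_CLUSTERS)
    clusters = {c: [] for c in range(N_CLUSTERS)}
    for idx, t in enumerate(templates):
        clusters[nearest(t, centroids)].append(idx)

    def identify(query):
        # stage 1: coarse cluster lookup (governed by the cluster rate)
        c = nearest(query, centroids)
        # stage 2: exhaustive search within the cluster (refinement rate)
        members = clusters[c]
        return members[nearest(query, [templates[i] for i in members])]

    # a noise-free observation routes to the correct cluster and individual
    print(identify(templates[7]))  # → 7
    ```

    A noisy observation channel can route a query to the wrong cluster in stage 1; compensating by assigning individuals to several clusters is what the excess-rate parameter captures, and noisier channels require more of it.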

    Signaling for the Gaussian Channel with side information at the transmitter

    No full text
    We investigate the Gaussian side information channel in the Shannon (1958) setup and propose a method that achieves correlation between the signal and the side information noise. However, the capacity remains to be determined.

    An information-theoretical approach to information embedding

    No full text

    The discrete memoryless multiple access channel with partially cooperating encoders

    No full text
    We introduce the communication situation in which the encoders of a multiple access channel are partially cooperating. These encoders are connected by communication links with finite capacities, which permit both encoders to communicate with each other. First we give a general definition of such a communication process (conference). Then, by proving a converse and giving an achievability proof, we establish the capacity region of the multiple access channel with partially cooperating encoders. It turns out that the optimal conference is very simple.

    Information theory and its application to optical communication

    Get PDF
    The lecture focuses on the foundations of communication developed within the field of information theory. Enumerative shaping techniques and the so-called square-root transform will be discussed in detail.

    Two results for the multiple access channel with feedback

    No full text

    Enumerative modulation techniques

    No full text